Evaluating Crowdsourcing Participants in the Absence of Ground-Truth

Authors

  • Subramanian Ramanathan
  • Rómer Rosales
  • Glenn Fung
  • Jennifer G. Dy
Abstract

Data can be acquired, shared, and processed by an increasingly large number of entities, in particular people. The distributed nature of this phenomenon has contributed to the development of many crowdsourcing projects. This scenario is prevalent in most forms of expert/non-expert group opinion and rating tasks (including many forms of internet or online user behavior), where a key element is the aggregation of observations and opinions from multiple sources.
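The abstract names aggregation from multiple sources as the key element but does not spell out a model in this teaser. For context, a standard baseline in this literature is to estimate item labels and per-annotator reliability jointly with EM, in the spirit of Dawid-Skene. The sketch below uses a deliberately simplified one-coin model (binary labels, a single accuracy per annotator, uniform class prior); all names and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def aggregate_em(votes, n_iter=50):
    """Jointly estimate item labels and annotator accuracies with EM.

    votes: dict mapping (item, annotator) -> label in {0, 1}.
    Returns (P(label=1) per item, estimated accuracy per annotator).
    One-coin model: each annotator is correct with a single probability.
    """
    items = sorted({i for i, _ in votes})
    annotators = sorted({a for _, a in votes})
    # Initialize item posteriors from the raw vote average (soft majority vote).
    p = {i: np.mean([l for (j, _), l in votes.items() if j == i]) for i in items}
    acc = {a: 0.8 for a in annotators}
    for _ in range(n_iter):
        # M-step: accuracy = expected fraction of votes agreeing with the item label.
        for a in annotators:
            agree = [p[i] if l == 1 else 1 - p[i]
                     for (i, b), l in votes.items() if b == a]
            acc[a] = min(max(np.mean(agree), 1e-3), 1 - 1e-3)  # avoid degenerate 0/1
        # E-step: item posterior under the current accuracies, uniform prior.
        for i in items:
            like1 = like0 = 1.0
            for (j, a), l in votes.items():
                if j != i:
                    continue
                like1 *= acc[a] if l == 1 else 1 - acc[a]
                like0 *= acc[a] if l == 0 else 1 - acc[a]
            p[i] = like1 / (like1 + like0)
    return p, acc

# Annotator ann3 disagrees with the consensus on both items, so its
# estimated reliability drops below that of ann1 and ann2.
votes = {("x1", "ann1"): 1, ("x1", "ann2"): 1, ("x1", "ann3"): 0,
         ("x2", "ann1"): 0, ("x2", "ann2"): 0, ("x2", "ann3"): 1}
posteriors, reliability = aggregate_em(votes)
```

The point of the loop is that no ground truth is needed: consensus bootstraps the reliability estimates, and the reliability estimates in turn reweight the consensus.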


Similar Papers

Crowdsourcing Labels for Pathological Patterns in CT Lung Scans: Can Non-experts Contribute Expert-Quality Ground Truth?

This paper investigates what quality of ground truth might be obtained when crowdsourcing specialist medical imaging ground truth from non-experts. Following basic tuition, 34 volunteer participants independently delineated regions belonging to 7 pathological patterns in 20 scans according to expert-provided pattern labels. Participants’ annotations were compared to a set of reference annotatio...


Crowdsourcing Disagreement for Collecting Semantic Annotation

This paper proposes an approach to gathering semantic annotation, which rejects the notion that human interpretation can have a single ground truth, and is instead based on the observation that disagreement between annotators can signal ambiguity in the input text, as well as how the annotation task has been designed. The purpose of this research is to investigate whether disagreement-aware cro...
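One simple way to operationalize "disagreement as signal" (an illustrative sketch, not necessarily the metric this paper uses) is the entropy of each item's annotation distribution: unanimous items score zero, contested items score high.

```python
from collections import Counter
from math import log2

def label_entropy(labels):
    """Entropy (bits) of one item's annotation distribution;
    higher values flag stronger annotator disagreement."""
    counts = Counter(labels)
    total = sum(counts.values())
    return sum(c / total * log2(total / c) for c in counts.values())

print(label_entropy(["spam", "spam", "spam"]))        # 0.0 (unanimous)
print(label_entropy(["spam", "ham", "spam", "ham"]))  # 1.0 (maximal split)
```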


Detecting Crowdturfing in Social Media

  • Astroturfing: a campaign that masks its supporters and sponsors to make it appear to be launched by grassroots participants.
  • Crowdsourcing: the process of obtaining needed services, ideas, or content by soliciting contributions from a group of people. Internet services facilitate the process by connecting customers and crowdsourcing workers.
  • Ground truth: Ground ...


Multi-Label Annotation Aggregation in Crowdsourcing

As a means of human-based computation, crowdsourcing has been widely used to annotate large-scale unlabeled datasets. One of the obvious challenges is how to aggregate these possibly noisy labels provided by a set of heterogeneous annotators. Another challenge stems from the difficulty in evaluating the annotator reliability without even knowing the ground truth, which can be used to build ince...
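Before any reliability modeling, the usual multi-label baseline is per-label majority voting, which treats every annotator as equally trustworthy; that blind spot is exactly what aggregation models like the one above try to fix. A minimal sketch, with names and threshold chosen for illustration:

```python
from collections import Counter

def per_label_majority(annotations, threshold=0.5):
    """Keep a label if more than `threshold` of annotators applied it.
    annotations: one set of labels per annotator, all for the same item."""
    counts = Counter(label for labels in annotations for label in labels)
    n = len(annotations)
    return {label for label, c in counts.items() if c / n > threshold}

# "cat" is kept (3/3 annotators); "outdoor" and "dog" are dropped (1/3 each).
print(per_label_majority([{"cat", "outdoor"}, {"cat"}, {"cat", "dog"}]))
```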


Crowdsourcing Ambiguity-Aware Ground Truth

The process of gathering ground truth data through human annotation is a major bottleneck in the use of information extraction methods. Crowdsourcing-based approaches are gaining popularity in the attempt to solve the issues related to volume of data and lack of annotators. Typically these practices use inter-annotator agreement as a measure of quality. However, this assumption often creates is...
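The inter-annotator agreement this blurb refers to is often computed as Cohen's kappa, i.e. pairwise agreement corrected for chance. A minimal sketch (a generic helper, not code from the paper):

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two annotators' label lists."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # raw agreement
    p_chance = sum((a.count(l) / n) * (b.count(l) / n)     # expected by chance
                   for l in set(a) | set(b))
    return 1.0 if p_chance == 1 else (p_obs - p_chance) / (1 - p_chance)

print(cohens_kappa(["pos", "neg", "pos", "pos"],
                   ["pos", "neg", "neg", "pos"]))  # 0.5
```

On genuinely ambiguous items, low kappa reflects the item rather than bad annotators, which is the failure mode of agreement-as-quality that this paper takes aim at.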



Journal:
  • CoRR

Volume: abs/1605.09432

Publication date: 2016